FAQC.HLP (1993-01-07)
This document is divided into two parts as follows.
1. Questions About This Package
2. Questions About The Theory Behind This Package
1. Questions About This Package
Q; What capabilities does this software package have that
differentiate it from those developed elsewhere?
A; This package
(1) Includes a network structure estimation program that allows
one to estimate how many hidden units an MLP must have to achieve
a user-chosen level of performance. This program usually, but not
always, works.
(2) Includes a fast training program unlike others that are
available. This technique is about 3 times faster than full-blown
conjugate gradient training (without heuristic changes for speeding
it up), and performs slightly better. Training is 10 to 100 times
faster than backpropagation.
(3) Includes a network structure analysis program. Given a trained
MLP, this program makes a table of network performance versus
the number of hidden units. Using the table, and the non-demo
version of this program, the user can choose the size wanted
and prune the network, saving new weight and network structure files.
The non-demo version can also determine the amount of nonlinearity
(degree) in each hidden layer, thereby informing the user if a
linear network would solve his problem.
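The structure analysis described in (3) can be pictured as a small experiment: repeatedly drop the least important hidden units of a trained one-hidden-layer MLP and record the MSE at each size. The sketch below is hypothetical; the function name, the tanh activation, and the weight-norm pruning order are assumptions for illustration, not the package's actual method.

```python
import numpy as np

def performance_vs_hidden_units(W1, b1, W2, b2, X, T):
    """Hypothetical sketch of a performance-vs-hidden-units table.

    W1 (Nin, Nh), b1 (Nh,)  : input-to-hidden weights and biases
    W2 (Nh, Nout), b2 (Nout,): hidden-to-output weights and biases
    X (Npat, Nin), T (Npat, Nout): training inputs and targets
    Returns a list of (number of hidden units kept, MSE) rows.
    """
    def forward(keep):
        # Evaluate the network using only the hidden units in `keep`.
        H = np.tanh(X @ W1[:, keep] + b1[keep])
        return H @ W2[keep, :] + b2

    # Rank hidden units by the norm of their output weights,
    # most important first (an assumed pruning criterion).
    order = np.argsort(np.linalg.norm(W2, axis=1))[::-1]
    table = []
    for n in range(len(order), 0, -1):
        keep = np.sort(order[:n])
        O = forward(keep)
        # MSE as defined later in this FAQ: (1/Npat) * sum of
        # squared output errors over all patterns and outputs.
        table.append((n, np.mean(np.sum((T - O) ** 2, axis=1))))
    return table
```

Each row of the returned table pairs a network size with its error, which is the information the analysis program's table provides for choosing how far to prune.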
Q; Why does this package design only classification networks, and not
mapping or estimation networks? Don't both types of network use
the same format for training data and the same training algorithms?
A; We have separate packages for classification and for mapping or
estimation because:
(1) Our training algorithms for classification and mapping networks have
some important differences. For example, the functional link net
design for a mapping net is not iterative, whereas that for
classification nets is iterative. The MLP classification network
learns even when the learning factor is set to 0, unlike the MLP
for mapping.
(2) Combining the two packages would make the result unnecessarily large.
(3) Many people need to do mapping or classification but not both.
Q; What error function is being minimized during backpropagation training,
fast training, and functional link net training?
A; MSE = (1/Npat) * SUM(p=1..Npat) SUM(k=1..Nout) [ Tpk - Opk ]^2
where Npat is the number of training patterns, Nout is the number
of network output nodes, Tpk is the desired output for the pth
training pattern and the kth output, and Opk is the actual output
for the pth training pattern and the kth output. The desired
output Tpk is 0 for the correct class and 1 for other classes.
MSE is printed for each iteration.
Q; What is the error percentage that is printed out during
backpropagation training, fast training, and functional link net
training?
A; Err = 100 x (number of patterns misclassified / Npat).
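Both printed quantities can be reproduced in a few lines. This is a hypothetical NumPy sketch, not the package's actual code; it follows the target convention stated above, where 0 marks the correct class, so a pattern is assigned to the output node with the smallest value.

```python
import numpy as np

def mse_and_error_rate(T, O):
    """Compute the two quantities printed during training.

    T : (Npat, Nout) desired outputs; per this FAQ's convention,
        0 for the correct class and 1 for the other classes.
    O : (Npat, Nout) actual network outputs.
    """
    Npat = T.shape[0]
    # MSE = (1/Npat) * SUM_p SUM_k [Tpk - Opk]^2
    mse = np.sum((T - O) ** 2) / Npat
    # With 0 marking the correct class, the decision is the
    # output node with the smallest value.
    err = 100.0 * np.mean(np.argmin(O, axis=1) != np.argmin(T, axis=1))
    return mse, err
```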
Q; I get an "Out of environment space" error when deleting a file in
the utilities section. How can I fix this?
A; Increase the environment space. Add the switch /e:512 or /e:1024
to the shell command in your config.sys file.
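For example, assuming COMMAND.COM is in C:\DOS (the path is system-dependent), the SHELL line in config.sys might read:

```
SHELL=C:\DOS\COMMAND.COM /E:1024 /P
```

The /P switch keeps the command interpreter permanent, and /E:1024 reserves 1024 bytes of environment space.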
Q; The package seems very sluggish when I use a serial mouse. How can
I fix this?
A; The package should run much faster if you disable the mouse. You can
comment out the mouse command in your autoexec.bat file. Also, a
bus-type mouse may work OK. We are still investigating the serial
mouse problem.
2. Questions About The Theory Behind This Package
Q; Do you have any papers related to the prediction of neural net
size (sizing)?
A; Classified.
Q; Do you have any papers related to fast training of MLPs, and
related topics?
A; Yes.
M.S. Dawson, A.K. Fung, M.T. Manry, "Sea Ice Classification Using
Fast Learning Neural Networks," Proc. of IGARSS'92, Houston, Texas,
May 1992, vol. II, pp. 1070-1071.
M.S. Dawson, J. Olvera, A.K. Fung, M.T. Manry, "Inversion of
Surface Parameters Using Fast Learning Neural Networks," Proc. of
IGARSS'92, Houston, Texas, May 1992, vol. II, pp. 910-912.
M.T. Manry, X. Guan, S.J. Apollo, L.S. Allen, W.D. Lyle, and W.
Gong, "Output Weight Optimization for the Multi-Layer Perceptron,"
Conference Record of the Twenty-Sixth Annual Asilomar Conference on
Signals, Systems, and Computers, Oct. 1992, vol. 1, pp. 502-506.
X. Jiang, Mu-Song Chen, and M.T. Manry, "Compact Polynomial
Modeling of the Multi-Layer Perceptron," Conference Record of the
Twenty-Sixth Annual Asilomar Conference on Signals, Systems, and
Computers, Oct. 1992, vol. 2, pp. 791-795.
R.R. Bailey, E.J. Pettit, R.T. Borochoff, M.T. Manry, and X. Jiang,
"Automatic Recognition of USGS Land Use/Cover Categories Using
Statistical and Neural Network Classifiers," Proceedings of SPIE
OE/Aerospace and Remote Sensing, April 12-16, 1993, Orlando
Florida.
M.S. Dawson, A.K. Fung, M.T. Manry, "Classification of SSM/I Polar
Sea Ice Data Using Neural Networks," Proc. of PIERS 93, 1993, p.
572.
F. Amar, M.S. Dawson, A.K. Fung, M.T. Manry, "Analysis of
Scattering and Inversion From Forest," Proc. of PIERS 93, 1993, p.
162.
A. Gopalakrishnan, X. Jiang, M-S Chen, and M.T. Manry,
"Constructive Proof of Efficient Pattern Storage in the Multilayer
Perceptron," Conference Record of the Twenty-Seventh Annual
Asilomar Conference on Signals, Systems, and Computers, Nov. 1993.
K. Rohani, M.S. Chen and M.T. Manry, "Neural Subnet Design by
Direct Polynomial Mapping," IEEE Transactions on Neural Networks,
Vol. 3, no. 6, pp. 1024-1026, November 1992.
M.S. Dawson, A.K. Fung, and M.T. Manry, "Surface Parameter
Retrieval Using Fast Learning Neural Networks," Remote Sensing
Reviews, Vol. 7, pp. 1-18, 1993.
M.T. Manry, S.J. Apollo, L.S. Allen, W.D. Lyle, W. Gong, M.S.
Dawson, and A.K. Fung, "Fast Training of Neural Networks for Remote
Sensing," Remote Sensing Reviews, vol. 9, pp. 77-96, 1994.
Q; Do you have any papers related to the analysis of trained neural
networks?
A; Yes.
W. Gong and M.T. Manry, "Analysis of Non-Gaussian Data Using a
Neural Network," Proceedings of IJCNN 89, vol. II, p. II-576,
Washington D.C., June 1989.
M.S. Chen and M.T. Manry, "Back-Propagation Representation Theorem
Using Power Series," Proceedings of IJCNN 90, San Diego, I-643 to
I-648.
M.S. Chen and M.T. Manry, "Basis Vector Analyses of
Back-Propagation Neural Networks," Proceedings of the 34th Midwest
Symposium on Circuits and Systems, Monterey, California, May 14-17,
1991, vol. 1, pp. 23-26.
M.S. Chen and M.T. Manry, "Power Series Analyses of
Back-Propagation Neural Networks," Proc. of IJCNN 91, Seattle WA.,
pp. I-295 to I-300.
M.S. Chen and M.T. Manry, "Nonlinear Modelling of Back-Propagation
Neural Networks," Proc. of IJCNN 91, Seattle WA., p. A-899.
M.S. Chen and M.T. Manry, "Basis Vector Representation of
Multi-Layer Perceptron Neural Networks," submitted to IEEE
Transactions on Neural Networks.
W. Gong, H.C. Yau, and M.T. Manry, "Non-Gaussian Feature Analyses
Using a Neural Network," accepted by Progress in Neural Networks,
vol. 2, 1991.
X. Jiang, Mu-Song Chen, M.T. Manry, M.S. Dawson, A.K. Fung,
"Analysis and Optimization of Neural Networks for Remote Sensing,"
Remote Sensing Reviews, vol. 9, pp. 97-114, 1994.
M.S. Chen and M.T. Manry, "Conventional Modelling of the
Multi-Layer Perceptron Using Polynomial Basis Functions," IEEE
Transactions on Neural Networks, Vol. 4, no. 1, pp. 164-166,
January 1993.
K. Rohani and M.T. Manry, "Multi-Layer Neural Network Design Based
on a Modular Concept," accepted by the Journal of Artificial Neural
Networks.
Q; Do you have any papers related to the prediction of neural net
performance, and the pre-processing of data?
A; Yes.
S.J. Apollo, M.T. Manry, L.S. Allen, and W.D. Lyle, "Optimality of
Transforms for Parameter Estimation," Conference Record of the
Twenty-Sixth Annual Asilomar Conference on Signals, Systems, and
Computers, Oct. 1992, vol. 1, pp. 294-298.
Q. Yu, S.J. Apollo, and M.T. Manry, "MAP Estimation and the
Multilayer Perceptron," Proceedings of the 1993 IEEE Workshop on
Neural Networks for Signal Processing, Linthicum Heights, Maryland,
Sept. 6-9, 1993, pp. 30-39.
S.J. Apollo, M.T. Manry, L.S. Allen, and W.D. Lyle, "Theory of
Neural Network-Based Parameter Estimation," submitted to Neural
Network Trans. of the IEEE.
S.J. Apollo, M.T. Manry, L.S. Allen, and W.D. Lyle,
"Transformation-Based Data Compression for Parameter Estimation,"
submitted to IEEE Trans. on Signal Processing.
Q; Do you have any papers related to the training of functional link
neural networks?
A; Yes.
H.C. Yau and M.T. Manry, "Sigma-Pi Implementation of a Nearest
Neighbor Classifier," Proceedings of IJCNN 90, San Diego, I-667 to
I-672.
H.C. Yau and M.T. Manry, "Sigma-Pi Implementation of a Gaussian
Classifier," Proceedings of IJCNN 90, San Diego, III-825 to
III-830.
H.C. Yau and M.T. Manry, "Shape Recognition Using Sigma-Pi Neural
Networks," Proc. of IJCNN 91, Seattle WA., p. II A-934.
H.C. Yau and M.T. Manry, "Shape Recognition with Nearest Neighbor
Isomorphic Network," Proceedings of the First IEEE-SP Workshop on
Neural Networks for Signal Processing, Princeton, New Jersey, Sept.
29 - Oct. 2, 1991, pp. 246-255.
H.C. Yau and M.T. Manry, "Iterative Improvement of a Gaussian
Classifier," Neural Networks, Vol. 3, pp. 437-443, July 1990.
H.C. Yau and M.T. Manry, "Iterative Improvement of a Nearest
Neighbor Classifier," Neural Networks, Vol. 4, Number 4, pp.
517-524, 1991.